Pay hackers to save lives | Electronic Frontier Foundation


How can we make the Internet more secure? According to Tarah Wheeler, this week's guest on How to Fix the Internet, part of the solution is incentives. As a security researcher with extensive experience in the hacker community, Tarah talks about how poorly many companies respond to security disclosures. Along with EFF co-hosts Cindy Cohn and Danny O'Brien, Tarah also talks about how existing computer crime laws can be used to scare security researchers rather than elevate them.

Click below to listen to the episode now, or choose your podcast player:

Almost nothing we do can be separated from computers, which means that computer security is vital to every aspect of our lives. Whether it's medical equipment or car navigation, better security makes us all safer. Note: We will hold a special live event with Tarah Wheeler on Thursday, December 9th, to continue this conversation. RSVP or learn more.

In this episode, you will learn:

Tarah Wheeler is an information security executive, a social scientist in the area of international conflict, an author, and a poker player. She serves on the EFF advisory board, is a cyber policy researcher at Harvard University, and is an international security researcher at New America. Last year, she was a Fulbright Scholar in cybersecurity. You can find her on Twitter @Tarah or on her website: https://tarah.org/.

If you have any feedback on this episode, please send an email to podcast@eff.org. You can also find a copy of the episode on the Internet Archive.

Below, you will find legal resources, including important cases, books, and briefs discussed in the podcast, as well as a full transcript of the audio.

Computer Fraud and Abuse Act (CFAA):

Tarah: So in 2010, I was getting married for the first time. Walking down the street late one night, I saw that the front door of a local bridal shop was standing open in the middle of the night. There was no one around; it looked like someone had assumed the door would close and lock behind them. So me, I stuck my head in and looked around: hey, is anyone here? I pulled the door closed, rattled it until I could feel that it had latched, locked it from the inside, and shut it. I left a note on the door that said, hey, just wanted to let you know your door was open, in case there's a problem with the lock.

Then I left. I never heard from them again. No thank-you note, no acknowledgment, nothing at all. That is exactly what many security researchers find when they try to submit third-party reports of security vulnerabilities to a company: they are simply ignored. And you know, that's a bit annoying.

Danny: That's Tarah Wheeler. She is our guest this week on How to Fix the Internet. We'll talk with her about coordinated vulnerability disclosure: what to do if you find a bug in software that needs to be fixed, and how people like Tarah help keep you safe.

Cindy: I'm Cindy Cohn. Welcome to How to Fix the Internet, a podcast of the Electronic Frontier Foundation that helps you understand how we can make our digital future a better place.

Danny: Welcome, Tarah. Thank you so much for joining us.

Tarah: Thank you so much for having me. Cindy, it's an incredible joy. Thank you, Danny.

Cindy: Tarah, you are a cyber policy researcher at Harvard and an international cybersecurity researcher at New America. You were a Fulbright Scholar in cybersecurity last year. And to our delight, you are also a member of the EFF advisory board. I just want to say that you know a lot about this stuff. Most importantly, you told us a story about walking past that bridal shop, locking the door, doing a good deed, and getting no response. Can you explain how that story relates to the coordinated vulnerability disclosure world you live in?

Tarah: Absolutely. Coordinated vulnerability disclosure is a process involving multiple stakeholders. Translated into normal human terms, that means a company needs a way to receive information from people who want to tell it that something is wrong.

Well, the problem is that companies often don't realize they should have an open-door policy that lets third-party security researchers tell them what's wrong. On the other side, security researchers need a way to give that information to the company without sounding like they are opening with a ransom demand.

Tarah: Let me give you an example. Say you find a vulnerability, I don't know, a cross-site scripting problem on a company's website. You may try to let the company know there is a problem: they have a security vulnerability that can be easily exploited, and it is exposed to the Internet.
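
To make that concrete, here is a minimal sketch (not from the episode; the app and route names are hypothetical) of the kind of bug Tarah is describing: a reflected cross-site scripting hole in a small Python/Flask page, alongside the one-line fix.

```python
# A minimal sketch of a reflected XSS bug and its fix.
# All names here are illustrative, not from the episode.
from flask import Flask, request
from markupsafe import escape

app = Flask(__name__)

@app.route("/search")
def search_vulnerable():
    # VULNERABLE: the query parameter is echoed into the page unescaped,
    # so /search?q=<script>alert(1)</script> runs in the visitor's browser.
    q = request.args.get("q", "")
    return f"<p>Results for: {q}</p>"

@app.route("/search-fixed")
def search_fixed():
    # FIXED: escape() HTML-encodes the input before it reaches the page.
    q = request.args.get("q", "")
    return f"<p>Results for: {escape(q)}</p>"
```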

Well, the question for many people is: if you don't know how to get this information to someone inside the company, what do you do? The industry-standard first step is for the company to make sure the alias security@company.com works as an email address for receiving reports from third-party researchers. Those of us in the industry and the community just want that email alias to work.

Failing that, you hope you can check their website and find a way to contact technical support or someone on the security team and let them know what's wrong. However, many companies have trouble accepting and acknowledging these reports because, honestly, there are two different tracks in this world. Security researchers operate in information-sharing communities that run on gratitude and money. Companies run on liability and public brand management. So when a company receives a third-party report about a security vulnerability, the company needs to go through a triage process.
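
One convention that has emerged for that "find a contact on their website" step, not mentioned in the episode itself, is a security.txt file served at a well-known URL. Here is a small sketch in Python that looks for one; the domain is a placeholder, and a real tool would need more robust handling.

```python
# Sketch: look for a site's security.txt disclosure contact.
# "example.com" is a placeholder, not a site from the episode.
import urllib.error
import urllib.request

def fetch_security_txt(domain: str):
    """Return the contents of /.well-known/security.txt, or None."""
    url = f"https://{domain}/.well-known/security.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.read().decode("utf-8", errors="replace")
    except (urllib.error.URLError, TimeoutError):
        return None

if __name__ == "__main__":
    text = fetch_security_txt("example.com")
    if text:
        # Typical files carry "Contact:" and "Policy:" lines.
        for line in text.splitlines():
            if line.startswith(("Contact:", "Policy:")):
                print(line)
    else:
        print("No security.txt found; try security@<domain> instead.")
```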

I'm here to tell you, as someone who has done this both internally and externally: when your company receives a vulnerability report, unless you can fix the vulnerability quickly, you may not want to acknowledge it. Sometimes there is pressure from within the company, especially from the lawyers, because acknowledging can be read as an admission that the company has triaged the issue and will fix the vulnerability in a timely fashion. And let me assure you, "timely" looks really different to a security researcher than it does to internal legal counsel.

Danny: So if someone finds a vulnerability and reports it to the company, and the company either blows it off or tries to cover it up, what are the consequences for ordinary users?

For example, how does it affect me?

Tarah: You mean, if a security researcher reports a vulnerability to a company and the company never fixes it, how does that affect the ordinary people who use the product?

Tarah: Well, I don't know about you, but when Equifax decided not to patch a vulnerability in their servers, I was one of the 143 million people whose personal information and credit history were exposed.

There is a story behind every data breach. Either people didn't know what was wrong, or people knew what was wrong but didn't prioritize fixing it and didn't understand how much impact it would have on them, on consumers, and on the people whose data they had stored.

Danny: You talked about coordinated vulnerability disclosure. So who is coordinating, and what is being coordinated?

Tarah: When we talk about the multiple stakeholders of a vulnerability, we are not just talking about the people who discovered it and the people who need to fix it, but also about those who defend the consumers who may be affected by it.

That is how you end up with situations where, say, the FTC has a conversation or two with companies that have repeatedly failed to fix major vulnerabilities in their systems while holding consumer data they should be protecting. EFF is a good example: it often wants to protect the larger population, not just researchers, not just people working inside the company, but everyone affected by the vulnerability. So when security researchers discover problems and report them to a company, the company's incentives need to be aligned so that it fixes the vulnerability rather than suing the researchers into silence.

Cindy: EFF has played a very important role in this. I remember the bad old days when security researchers would get a knock on the door from law enforcement almost immediately, or get prosecuted, simply for boldly telling a company that it had a security problem.

What I really like about the way the world has evolved is that we now routinely have this kind of conversation in the software industry. But you know, computers are everywhere now. They are in cars and refrigerators, and in medical equipment inside our bodies, like insulin pumps and heart monitors.

I want to know what you are seeing in other industries that have now, in effect, joined the computer software industry.

Tarah: I recently joined an IoT security group at the Organisation for Economic Co-operation and Development, and I have been talking with someone from the Australian consumer product safety commission, or its equivalent agency. And I am here to tell you, putting computers in everything raises a fascinating question, because the whole point of view of consumer product safety people has little to do with whether the computer inside has software vulnerabilities, and everything to do with things like whether the device handles temperature correctly.

So when we start talking about putting computers in everything, when we talk about temperature differences, about changing the temperature in refrigerators and freezers, about whether a sous vide cooker reads the right temperature, we start talking about things that can kill people. That is the kind of vulnerability that can kill.

Danny: Do you think software is particularly different from other disciplines where we have solved safety problems? Bridges no longer collapse, right? Is there something we do with bridges that we could do with software, or is it just hard to do with software?

Tarah: First, software is hard. Second, the industry I would like us to grow toward is not bridges, it is aviation. I will tell you how aviation's approach to safety differs from ours in information security: when an accident happens, they hold a blameless postmortem to discuss how and why it happened. They use a multi-stakeholder approach, letting many different people examine the cause of the incident. And of course we are talking about the NTSB and the FAA, the National Transportation Safety Board and the Federal Aviation Administration. In aviation, the exchange of knowledge between pilots is perpetual, continuous, expected, and regulated, and the expectation for those of us who fly is that we will always tell others what we did wrong. There is a culture of constantly discussing and exposing our own mistakes and talking about how others can avoid them, and there is no punishment for doing so. Every pilot there will tell you what they did wrong in order to persuade you not to make the same mistake. That is not the case in information security. We hide our mistakes as quickly as possible, burying them under a pile of liability, terrible email subject lines, lawyers, and client confidentiality. That culture of secrecy is everywhere here, and it is why we have this problem.

: "How to Fix the Internet" is supported by the Alfred P. Sloan Foundation Public Understanding Science Project, which enriches people by understanding our increasingly technological world more acutely and depicting the complex human nature of scientists, engineers, and mathematicians life. 

Cindy: I think this is a really good place to dig into how we fix it, because one of the reasons lawyers are so worried is that liability risks are sometimes exaggerated, but sometimes they are real. And the same thing is true on the other side: laws have been written that create risk for security researchers, the risk of telling someone the truth, right? At the same time, the threat of liability does not align a company's incentives with the actions that are most beneficial to society. To me, this does not mean we should shield companies from all liability. When planes fall out of the sky, we still want companies to be responsible for the damage they cause. But liability there does not prevent learning from mistakes, while in other parts of the software world we see liability becoming an obstacle. Some of that can be fixed through changes to laws and policies, but some of it requires changing how we approach these issues.

Tarah: There was a recent case where a ransomware attack on a medical institution, I think it was in the South, led a patient to file a lawsuit. The patient's daughter died after receiving inadequate care at a hospital that was in the middle of a cyberattack. The mother has now filed suit, and people following the case worry that suing the hospital, whether for not revealing it was under cyberattack or for failing to provide proper care during the ransomware attack, is unlikely to make hospitals disclose more about their network outages. It may instead cause hospitals to shut patients out during a ransomware attack.

Now, that is the wrong lesson to learn. So when I look at the way we think about liability for critical infrastructure, the incentives around being open and honest about the fact that a hospital is under cyberattack are completely wrong. The hospital chose not to pay the ransom. It is important to note that they might never have been able to recover the records or restore the network in time to save that woman's daughter anyway. But at the same time, we cannot let hospitals conclude that the lesson of experiencing a cyberattack and then litigation is that they need to shut up more and pay more ransoms. We cannot let fear of liability teach organizations that this is the right response.

Cindy: One of the ways we have helped address this in California is the data breach notification law. Because so many companies are located in California, a breach there affects people all over the world. The company's liability risk does not drop to zero, and I think that matters: people can still sue if, say, a child dies because of a data breach. You cannot eliminate all liability, but if you tell people about the data breach promptly, the liability changes. So there are things we can do; a national data breach notification law is one of the things we could consider. We can set up these incentives so that companies' risk assessments shift toward talking about these things instead of staying silent.

California's law is a good example, but you know, a lot of federal cybersecurity law says, well, you have to tell the government, but you do not have to tell anyone else.

Tarah: There is a shortage of qualified senior cybersecurity professionals, and that is the legacy of the Computer Fraud and Abuse Act of 1986. The government did this to itself, because the people who cared enough to try to comply with the law, the people curious enough to start doing information security research, to start doing offensive security research and help people understand their weaknesses, were afraid of the CFAA. The CFAA chilled independent security research to a degree that I think most people still do not really appreciate. So the result is that we ended up creating a culture of fear among the people trying to be ethical and law-abiding, while in the United States the people willing to risk federal law enforcement stuck around just long enough to make a quick profit and get out.

Danny: It's not just in America, right? I mean, one of the most disappointing things is that the CFAA has been exported all over the world, so that no matter where they live, people facing exactly the same challenges become criminals rather than good citizens.

Tarah: We are dealing with a law that should have had a sunset clause from the beginning, a law that was enacted and upheld by judges who think you can whistle into a pay phone and launch a nuclear weapon. Look, people: computers are not magic sky-fairy boxes full of pixie dust that can change the world in some mystical way. I mean, unless you are mining on the blockchain, because we all know that is magical. The point stands: computers are not magic. Our legal system has flaws, and in the United States the CFAA is often piled on top of other charges in prosecutions. We have laws that describe what fraud is and what theft is, and the fact that it happens through a computer does not make it not fraud. It does not make it worse than fraud. It is just a different medium.

We all know what is right and what is wrong, right? So adding a law that says things get worse if you used the magic wizard box to commit your crime makes no sense to those of us who use computers every day.

Cindy: I think one of the lessons of the CFAA is that we probably shouldn't make laws based on Matthew Broderick movies from the '80s, right? The CFAA was famously passed after President Reagan saw WarGames. It is a very fun movie, but it is not actually realistic. So, please go on.

Tarah: So the essence of the CFAA, the real story here, is that it is used by people who do not understand it to prosecute people who never intended to break the law, or who, if they did, broke laws we already have on the books covering the issue. So in practice, the CFAA is now mostly used, as far as we can see in the industry, to stop employees of large companies from leaving to start competing businesses.

That is the practical use quietly going on behind the scenes of the CFAA. The other, very public use is to go after people in cases that have little to do with what the law was meant for. They may well be bad actors. In the recent Van Buren v. United States case, a police officer colluded with criminals who were harassing women and used departmental computers to look up information about them. The problem we have here is that this police officer was charged under the CFAA. Now, he is not a good person, but we already have names for the laws he violated: abuse of public trust, fraud, and theft.

The problem we have here is that the crimes he committed are exactly the same whether he looked those women up on his laptop or walked back to the department's filing cabinets, made entirely of paper and wood, and pulled their records.

So we have a law that is used to prosecute information security researchers and employees who leave companies, or to redundantly prosecute bad actors for crimes we already recognize. We don't need the CFAA. We already know when people are doing right and wrong things, and we have laws for those things. But judges are easily convinced that terrible things happen on computers, because they do not understand how computers work.

Cindy: Let's pivot a little to what it would look like if we got this right. We have already talked about one piece: the Computer Fraud and Abuse Act no longer being used to scare people out of doing good deeds. The idea of a global network of cooperating people committed to building security for all of us is something we embrace, not something we discourage. We need that at the individual level with the CFAA, at the corporate level by aligning corporate incentives, and at the government level, where we encourage governments not to hoard vulnerabilities or become the biggest buyer on the private market, but to hand that information to companies so they can fix it and make things better.

If we get it right, what do you think it looks like, Tarah?

Tarah: If we get it right, security researchers who report vulnerabilities to a company are appropriately compensated. That does not mean a researcher who reports a previously undiscovered but minor vulnerability should get a Porsche every time.

It does mean that researchers who are trying to help should at least get a thank-you. And when you find and report a vulnerability that is a company-killer, something absolutely critical that could paralyze the entire system and the entire company, you should be appropriately compensated.

Cindy: So we need to make the incentives for doing the right thing outweigh the incentives for doing the wrong thing, I hear you saying.

Tarah: That's right. We need to align those incentives.

Cindy: That's just chilling, right? Because what if those security researchers sell the vulnerability to people who want to sabotage our elections? I can't imagine anything where security, correctness, and protection matter more than our basic right to vote and to live in a functioning democracy. When you are dealing with something that important to a functioning society, we shouldn't rely on the kindness of security researchers to tell the good guys about a problem rather than the bad guys. We need to set up incentives that always push them toward the good guys, and for the systems that monitor our health and protect our votes, those incentives should be the strongest.

Tarah: Of course. The same mindset that lets you discover vulnerabilities after the fact lets you see where they get created. Unfortunately, first, not enough companies conduct proper product security reviews early in the development process. Why? Because it's expensive. And second, there aren't enough qualified product security people, reviewers, and developers. There simply aren't enough of them, partly because companies in many cases have not made them welcome. Right now, the cybersecurity field is exploding with people who want to work in it. But at the most senior levels, there is without question a serious lack of diversity. For women, people of color, and queer people, when they cannot see people like themselves at the top of companies, it is hard to picture getting there themselves, right? They don't see themselves succeeding in this field, although I am here to tell you that, by contrast, the pay in cybersecurity is really good.

So I invite all of my friends to take a training class and start taking computers apart, because if you like puzzles and want to be paid well, this is a great field. But the lack of diversity and openness directly limits the number of people we get into this field, and we must, must, must start thinking about what a cybersecurity expert looks like in a different way.

Cindy: There seems to be a failure of imagination about what a hacker looks like or should be. Tarah, you don't look like that image... We really need to create better, broader models of what the people who care about computer security look and sound like.

Tarah: We do. What people really want is a sense of security, the feeling that you have hired a smart, educated person who can tell you everything will be okay. That human weakness gets imported into cybersecurity time and time again. If you don't look like what someone thinks an expert looks like, it is hard to reassure them that you are an expert. That is the barrier for women, people of color, and queer people in cybersecurity today. They are trusted experts; it is just a question of breaking through that barrier.

Cindy: This is a community project. The realization for me is that we are on other people's computers all day, and sometimes other people are on our computers too. When someone walks by and says, "Hey, I think you have a security problem here," the correct thing to do is to thank them. Not attack them, and certainly not throw the criminal law at them.

Tarah: If you are a company insider, I want you to send an email to security@yourcompany.com, whatever it is. I want you to find out what happens. Does it bounce? Does it go to somebody who no longer works at the company? Or, as I once discovered, does it go straight to the CEO's chief of staff? Seriously, look at where it goes, because this is how people try to talk to you and make your world a better place. And if no one has checked that mailbox in a while, maybe open it the way you would watch a solar eclipse: close one eye and peek at it from behind someone. Because it is going to explode.
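
If you want to go one step further than sending a test message, here is a hedged sketch of an SMTP-level probe that asks a mail server whether the security@ address is accepted at all. The hostnames and addresses are hypothetical, and many servers accept every recipient and bounce later, so a positive answer only means the alias is not obviously broken.

```python
# Sketch: probe whether security@ is accepted at the SMTP level.
# All hostnames and addresses below are hypothetical placeholders.
import smtplib

def alias_accepted(mx_host: str, address: str) -> bool:
    """True if the server answers 250 to RCPT TO for the address."""
    with smtplib.SMTP(mx_host, 25, timeout=15) as smtp:
        smtp.helo("probe.example.org")   # identify the probing host
        smtp.mail("probe@example.org")   # dummy envelope sender
        code, _ = smtp.rcpt(address)     # ask about the alias itself
        return code == 250               # 250 = recipient accepted

if __name__ == "__main__":
    if alias_accepted("mx.example.com", "security@example.com"):
        print("security@ looks deliverable")
    else:
        print("security@ was refused: time to fix the alias")
```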

Cindy: I could do this all day, Tarah. It is fascinating to talk with you and hear your point of view, because in this conversation you have stood in so many different places. And I think that with people like you, and if more people listen to Tarah, we can solve this. So thank you so much for coming to talk with us and giving us firsthand stories of how this plays out in the field.

Tarah: It's so kind of you to invite me. Cindy and Danny, I just want to hang out with you, you know, drink inappropriate morning wine with you and yell about how everything on the Internet is broken. I mean, it's a wonderful pastime, and it's also a wonderful opportunity to make the world a better place, just by recognizing that we are connected to each other, that fixing one thing in one place does not affect only that one thing. It affects everyone. It's great to be with you and to have the chance to make things better.

Cindy: Well, that was great. You know, Tarah's passion and her love for computer security and security research shine through, and it is contagious. It really makes me think there is a lot we can do here to make things better. What really struck me, and you know I have been an opponent of the Computer Fraud and Abuse Act for a long time, is how she described the way it has made security research frightening. In the end, that hurts our country and the world. But what she said was very specific: it creates a culture of fear among those who try to be ethical and law-abiding. That really should send a chill down our spines. The good news is that we got some relief from the Supreme Court's Van Buren case that we talked about, but there is still a lot to do.

Danny: I think she successfully conveyed the stakes here and the human impact of these vulnerabilities. It is not just that your credit rating drops because personal data leaked. It is that children in hospitals can die if people do not fix security holes.

Cindy: Another thing I really liked is that Tarah focuses on aligning financial incentives on both sides: penalizing companies that do not fix or talk about security vulnerabilities, and compensating the security researchers who do us all the favor of finding them. You know, I often talk about the four levers of change that Larry Lessig first identified: law, norms, code, and markets. This lever is very much about the market, about how we align financial incentives to make things better.

Danny: Yes. I think people have become very nihilistic about solving computer security problems, and Tarah cited a practical, real, pragmatic inspiration for how you can improve it, which is very positive. That is the aviation industry, where you have a community that works across companies and across countries, internationally, in a very transparent and methodical way to defend against problems with a very similar shape, right? The smallest mistake can have huge consequences, and people's lives are at stake, so everyone must work together. I love that there is something in the real world that our utopian vision can be grounded in.

Cindy: Another thing I really appreciate is how clearly Tarah shows that we live in a connected, online world. We spend a lot of our time communicating with each other on other people's computers, and the way we fix things is to recognize that. Accepting the fact that we are all connected is the way forward.

Thank you, Tarah Wheeler, for joining us and letting us learn more about your world.

Danny: If you like what you hear, please follow us on your favorite podcast player. We have more episodes coming, with smart people who will tell you how we fix the Internet.

The music for this podcast was composed by Nat Keefe and Reed Mathis of BeatMower.

"How to Fix the Internet" was supported by the Alfred P. Sloan Foundation Public Understanding Science and Technology Program. This is Danny O'Brien.

Cindy: And I'm Cindy Cohn. Thank you so much for joining us today. Until next time.
